JS-MA: A Jensen-Shannon Divergence Based Method for Mapping Genome-Wide Associations on Multiple Diseases

Authors

Abstract


Similar Articles

Non-parametric Jensen-Shannon Divergence

Quantifying the difference between two distributions is a common problem in many machine learning and data mining tasks. What is also common in many tasks is that we only have empirical data. That is, we do not know the true distributions nor their form, and hence, before we can measure their divergence we first need to assume a distribution or perform estimation. For exploratory purposes this ...
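The estimate-then-compare setting described above can be sketched as follows. This is a minimal illustration, not the paper's method: the Gaussian samples, the bin count, and the base-2 logarithm are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
# Two empirical samples; in practice the true distributions are unknown.
x = rng.normal(loc=0.0, scale=1.0, size=5000)
y = rng.normal(loc=0.5, scale=1.0, size=5000)

# Step 1: estimate the two distributions with histograms over shared bins.
# The bin count (40 here) is a tuning choice the estimate depends on.
bins = np.linspace(min(x.min(), y.min()), max(x.max(), y.max()), 41)
p, _ = np.histogram(x, bins=bins)
q, _ = np.histogram(y, bins=bins)
p = p / p.sum()
q = q / q.sum()

# Step 2: measure the divergence of the two estimates.
def kl(a, b):
    """KL(a || b) in bits, summing only where a > 0."""
    mask = a > 0
    return float(np.sum(a[mask] * np.log2(a[mask] / b[mask])))

m = 0.5 * (p + q)
jsd = 0.5 * kl(p, m) + 0.5 * kl(q, m)
print(jsd)  # some value in [0, 1]; it depends on the samples and bin count
```

Note that the result measures the divergence of the *histogram estimates*, not of the true distributions, which is exactly the gap a non-parametric estimator aims to close.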


A Graph Embedding Method Using the Jensen-Shannon Divergence

Riesen and Bunke recently proposed a novel dissimilarity-based approach for embedding graphs into a vector space. One drawback of their approach is the computational cost of the graph edit operations required to compute the dissimilarity for graphs. In this paper we explore whether the Jensen-Shannon divergence can be used as a means of computing a fast similarity measure between a pair of graphs. We ...


A Note on Bound for Jensen-Shannon Divergence by Jeffreys

We present a lower bound on the Jensen-Shannon divergence by the Jeffreys divergence when pi ≥ qi is satisfied. In the original Lin's paper [IEEE Trans. Info. Theory, 37, 145 (1991)], where the divergence was introduced, the upper bound in terms of the Jeffreys divergence was one quarter of it. In view of a recent sharper one reported by Crooks, we present a discussion on upper bounds by transcendental fu...


Nonextensive Generalizations of the Jensen-Shannon Divergence

Convexity is a key concept in information theory, namely via the many implications of Jensen’s inequality, such as the non-negativity of the Kullback-Leibler divergence (KLD). Jensen’s inequality also underlies the concept of Jensen-Shannon divergence (JSD), which is a symmetrized and smoothed version of the KLD. This paper introduces new JSD-type divergences, by extending its two building bloc...
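The symmetrized, smoothed construction described above can be written directly from the definition: JSD(P, Q) = ½ KL(P‖M) + ½ KL(Q‖M) with midpoint M = ½ (P + Q). A minimal NumPy sketch (base-2 logarithms are an assumption here, chosen so the value lies in [0, 1]):

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence KL(p || q) in bits.

    Sums only over entries with p > 0; assumes q > 0 wherever p > 0,
    which always holds when q is the midpoint of p and another distribution.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

def jensen_shannon_divergence(p, q):
    """JSD(p, q) = 0.5 * KL(p || m) + 0.5 * KL(q || m), m = (p + q) / 2."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

# Identical distributions give 0; fully disjoint supports give the maximum, 1 bit.
p = [0.5, 0.5, 0.0]
q = [0.0, 0.5, 0.5]
print(jensen_shannon_divergence(p, p))  # 0.0
print(jensen_shannon_divergence([1.0, 0.0], [0.0, 1.0]))  # 1.0
```

Unlike the KLD, this quantity is symmetric in its arguments and always finite, since both distributions are absolutely continuous with respect to the midpoint M; these are the two properties the smoothing buys.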


A Jensen-Shannon Divergence Kernel for Directed Graphs

Recently, kernel methods have been widely employed to solve machine learning problems such as classification and clustering. Although there are many existing graph kernel methods for comparing patterns represented by undirected graphs, the corresponding methods for directed structures are less developed. In this paper, to fill this gap in the literature we exploit the graph kernels and graph co...



Journal

Journal title: Frontiers in Genetics

Year: 2020

ISSN: 1664-8021

DOI: 10.3389/fgene.2020.507038